Alternating direction method of multipliers for penalized zero-variance discriminant analysis
Authors
Abstract
We consider the task of classification in the high-dimensional setting where the number of features of the given data is significantly greater than the number of observations. To accomplish this task, we propose sparse zero-variance discriminant analysis (SZVD) as a method for simultaneously performing linear discriminant analysis and feature selection on high-dimensional data. This method comb...
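The SZVD formulation itself is not reproduced in this abstract, but the splitting it alludes to, a smooth fit term paired with a sparsity-inducing l1 penalty handled by ADMM, follows the standard scaled-form ADMM template. The sketch below applies that template to a generic lasso-type problem as an illustration only; the objective, the penalty weight lam, and the step parameter rho are assumptions made for the example and are not taken from the paper.

```python
import numpy as np

def soft_threshold(v, kappa):
    """Proximal operator of kappa * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - kappa, 0.0)

def admm_lasso(A, b, lam, rho=1.0, n_iter=200):
    """Generic scaled-form ADMM for min 0.5*||Ax - b||^2 + lam*||z||_1  s.t. x = z.

    Only an illustration of the smooth-plus-l1 splitting the abstract refers to,
    not the SZVD algorithm itself.
    """
    n = A.shape[1]
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)                       # scaled dual variable
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor once: the x-update solves (A^T A + rho*I) x = A^T b + rho*(z - u).
    L = np.linalg.cholesky(AtA + rho * np.eye(n))
    for _ in range(n_iter):
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))   # x-update
        z = soft_threshold(x + u, lam / rho)                # z-update (prox of l1)
        u = u + x - z                                       # dual update
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 200))    # high-dimensional setting: p >> n
    x_true = np.zeros(200)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    x_hat = admm_lasso(A, b, lam=0.1)
    print("nonzeros recovered:", np.count_nonzero(np.abs(x_hat) > 1e-6))
```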
Similar resources

Alternating direction method of multipliers for sparse zero-variance discriminant analysis and principal component analysis
We consider the task of classification in the high-dimensional setting where the number of features of the given data is significantly greater than the number of observations. To accomplish this task, we propose sparse zero-variance discriminant analysis (SZVD) as a method for simultaneously performing linear discriminant analysis and feature selection on high-dimensional data. This method comb...
Bregman Alternating Direction Method of Multipliers
The mirror descent algorithm (MDA) generalizes gradient descent by using a Bregman divergence to replace squared Euclidean distance. In this paper, we similarly generalize the alternating direction method of multipliers (ADMM) to Bregman ADMM (BADMM), which allows the choice of different Bregman divergences to exploit the structure of problems. BADMM provides a unified framework for ADMM and it...
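The core move described here is swapping the squared Euclidean proximal term in the ADMM subproblems for a Bregman divergence chosen to match the problem's geometry. A minimal sketch of that building block, assuming the negative-entropy Bregman function over the probability simplex (which yields the familiar multiplicative mirror step), is shown below; it illustrates the divergence swap only, not the full BADMM algorithm, and the step size eta is an arbitrary illustrative choice.

```python
import numpy as np

def euclidean_prox_step(y, grad, eta):
    """argmin_x <grad, x> + (1/(2*eta)) * ||x - y||^2  (unconstrained Euclidean step)."""
    return y - eta * grad

def kl_prox_step(y, grad, eta):
    """argmin_{x in simplex} <grad, x> + (1/eta) * KL(x, y).

    With the negative-entropy Bregman function this has the closed-form
    multiplicative update x_i ∝ y_i * exp(-eta * grad_i), the kind of
    divergence swap a Bregman ADMM subproblem can exploit.
    """
    x = y * np.exp(-eta * grad)
    return x / x.sum()

if __name__ == "__main__":
    y = np.full(4, 0.25)                          # current iterate on the simplex
    grad = np.array([1.0, 0.5, -0.5, 0.0])
    print("euclidean:", euclidean_prox_step(y, grad, eta=0.1))
    print("entropic :", kl_prox_step(y, grad, eta=0.1))
```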
Adaptive Stochastic Alternating Direction Method of Multipliers
The Alternating Direction Method of Multipliers (ADMM) has been studied for years. Traditional ADMM algorithms need to compute, at each iteration, an (empirical) expected loss function on all training examples, resulting in a computational complexity proportional to the number of training examples. To reduce the complexity, stochastic ADMM algorithms were proposed to replace the expected loss f...
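A minimal sketch of the cost trade-off described here, assuming a logistic loss purely for concreteness: the full-batch gradient touches all n training examples at every iteration, whereas the stochastic surrogate uses a single sampled example inside a linearized proximal x-step of the kind stochastic ADMM variants employ. The function names, the step parameters rho and eta, and the loss are illustrative assumptions, not this paper's adaptive scheme.

```python
import numpy as np

def full_gradient(w, X, y):
    """Full-batch gradient of the average logistic loss: O(n) examples per call."""
    z = X @ w
    return X.T @ (1.0 / (1.0 + np.exp(-z)) - y) / len(y)

def stochastic_gradient(w, X, y, i):
    """Gradient of the loss on one sampled example: O(1) examples per call."""
    xi, yi = X[i], y[i]
    return xi * (1.0 / (1.0 + np.exp(-xi @ w)) - yi)

def linearized_x_step(w, grad, z, u, rho, eta):
    """Linearized proximal x-update used by stochastic ADMM variants:
    argmin_x <grad, x> + (rho/2)*||x - z + u||^2 + (1/(2*eta))*||x - w||^2.
    """
    return (rho * (z - u) + w / eta - grad) / (rho + 1.0 / eta)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 20))
    y = (rng.random(1000) < 0.5).astype(float)
    w = np.zeros(20); z = np.zeros(20); u = np.zeros(20)
    i = rng.integers(len(y))
    g = stochastic_gradient(w, X, y, i)   # one example, vs full_gradient's pass over all n
    w = linearized_x_step(w, g, z, u, rho=1.0, eta=0.1)
```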
Fast Stochastic Alternating Direction Method of Multipliers
In this paper, we propose a new stochastic alternating direction method of multipliers (ADMM) algorithm, which incrementally approximates the full gradient in the linearized ADMM formulation. Besides having a per-iteration complexity as low as that of existing stochastic ADMM algorithms, the proposed algorithm improves the convergence rate on convex problems from O(1/√T) to O(1/T), where T is the ...
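One standard way to incrementally approximate the full gradient is an SVRG-style estimator that corrects a single-example gradient with a full gradient stored at an occasional snapshot; the sketch below shows that estimator, again assuming a logistic loss for concreteness. It plugs into the same kind of linearized x-step as in the previous sketch, but the paper's exact incremental scheme may differ.

```python
import numpy as np

def grad_i(w, X, y, i):
    """Gradient of the logistic loss on example i."""
    xi, yi = X[i], y[i]
    return xi * (1.0 / (1.0 + np.exp(-xi @ w)) - yi)

def full_grad(w, X, y):
    """Full-batch average gradient (recomputed only at occasional snapshots)."""
    return X.T @ (1.0 / (1.0 + np.exp(-X @ w)) - y) / len(y)

def variance_reduced_grad(w, w_snap, mu_snap, X, y, i):
    """SVRG-style incremental approximation of the full gradient.

    Unbiased for the full gradient at w, with variance that shrinks as w and
    w_snap approach the optimum. One standard realization of the idea; not
    necessarily the estimator used in this particular paper.
    """
    return grad_i(w, X, y, i) - grad_i(w_snap, X, y, i) + mu_snap

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((500, 10))
    y = (rng.random(500) < 0.5).astype(float)
    w_snap = np.zeros(10)
    mu_snap = full_grad(w_snap, X, y)     # stored full gradient at the snapshot
    w = w_snap.copy()
    g = variance_reduced_grad(w, w_snap, mu_snap, X, y, i=rng.integers(500))
    print("estimator:", g)
```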
Journal
Journal title: Computational Optimization and Applications
Year: 2016
ISSN: 0926-6003, 1573-2894
DOI: 10.1007/s10589-016-9828-y